
The Cassandra Memorandum: Google in 2013

Apollo fell in love with a priestess and offered her the terrible gift of prophecy. She accepted the gift, but when Apollo asked her to lie with him, the daughter of Priam refused. The angry god cursed her: the young priestess would still be a prophetess, but no one would ever believe her.

Her name was Cassandra.

Cassandra, by Anthony Frederick Augustus Sandys

Every day this month, I’ve seen Twitter posts with every kind of prediction about how web marketing disciplines will look in 2013.

I am not exempt from the desire to predict the future. The urge is something deeply human and, in an industry as uncertain as ours, sometimes it is a psychological necessity. However, some of those predictions are so improbable that this obscure prediction (written to be blatantly false by one of my Spanish friends) seems more plausible:

“Google launches Google Balrog. The name of this new Google algorithm is inspired by the name of the mythical creature imagined by Tolkien, because it will operate the same way.

It will be an algorithm that, wrapped in fire, will crawl the Web, penalizing and incinerating sites which do not include the anchor text “click here” at least seven times and do not include a picture of a kitten asleep in a basket.

If your site does not meet these minimums, the Balrog will come after you.” (The translation is mine, from Ricardo’s original post in Spanish.)

Every speculation about how something may evolve in the future should be based on the knowledge of the past, and, in the case of Google, we should not make the mistake of excluding elements like its acquisitions and technological evolution when predicting its future.

For example, Panda should be seen as a needed action that Google took in order to solve a problem caused by the technological advancement of Caffeine. In fact, with Caffeine (June 2010), Google was able to find new pages (or new information about existing pages) and could add them straight to the index.

As a negative side effect, gazillions of poor-quality pages flooded the SERPs, objectively deteriorating them. I’m sure the Search Quality team was already working on finding a solution to the poor results in the SERPs, but this particular Caffeine side effect accelerated the creation of the Panda algorithm.

Opening the prophecies book of Cassandra

If you visit Google’s About Us page, you will read this: “Google’s mission is to organize the world’s information and make it universally accessible and useful.”

That is the why of Google, and three words matter the most:

  1. Organize
  2. Accessible
  3. Useful

The how is represented by its algorithms; the what is made up of all the products Google has developed over the years (Search, Local, Mobile, Video, Voice, etc.).

The Golden Circle of Google

 

Organized

For many years, I’ve considered Google a cataloguer of objects, one that offers information on a variety of topics:

  • Written content

    • “Generic” content
    • Blog posts
    • News
    • Questions/Answers
    • Books
    • Patents
    • Academic articles
  • Photos and images
  • Videos
  • Apps

You’ve probably noticed that these are the vertical searches Google offers and that make up Universal Search right now (I excluded Products because it is a paid option, and I count Google+ statuses as “posts”).

Almost all of these “objects” have their own algorithms, which are frequently updated – think of the YouTube, Google News, and Google Images algorithm updates. And all of them have their flaws (for instance, the real value to assign to a link).

Until recent years, Universal Search seemed similar to a building constructed with Legos, but three important changes in 2011 altered the landscape. These changes kept developing in 2012 and – possibly – will be consolidated in 2013, which could truly unify all the vertical searches. The three changes are:

  1. Schema.org
  2. Authorship
  3. Google+

Schema.org

We know that Google is using semantic closeness in its categorization of crawled information, just as we know that the concept of Entity is strongly related to semantics.

However, this aspect of crawling has assumed greater relevance since the implementation of Schema.org. The general acceptance of HTML5 as the new standard (pushed by Google since its beginning) and tools like Open Graph have helped boost that relevance as well.

The fact that Google offers a way to verify the accuracy of a rich snippets implementation (and is continuously updating that tool, which it renamed the Structured Data Testing Tool), and recently added the ability to highlight Events structured data directly from Webmaster Tools, makes me suspect that semantics is going to have even greater weight in how Google organizes the information it crawls.
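To make that concrete, here is a minimal sketch of what schema.org markup looks like in microdata (the event name, date, and venue below are invented for illustration; the itemscope/itemtype/itemprop pattern is what the Structured Data Testing Tool validates):

```html
<!-- Minimal schema.org Event markup in microdata; the event details are hypothetical -->
<div itemscope itemtype="http://schema.org/Event">
  <h2 itemprop="name">SEO Meetup Valencia</h2>
  <time itemprop="startDate" datetime="2013-02-15T19:00">February 15, 2013, 7:00 pm</time>
  <div itemprop="location" itemscope itemtype="http://schema.org/Place">
    <span itemprop="name">A conference venue</span>,
    <span itemprop="address">Valencia, Spain</span>
  </div>
</div>
```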

The Knowledge Graph panel for CBS Records Inc.

The Knowledge Graph (built on open data such as Freebase, which Google owns since its acquisition of Metaweb in 2010), which recently rolled out in many regional Googles, is based on the semantic web and is improving very quickly. My advice is to start thinking seriously about semantics as a channel for possible web marketing and SEO actions.

Authorship

AuthorRank has been one of the hot topics of 2012. People smarter than me wrote about it, and even created a tool around it.

In my 2011 “Wake up SEOs, the New Google is here” post, I presented my hypothesis that AuthorRank would become part of a more complex set of graphs whose purpose is to organize and present the genuinely best content in the SERPs. We have not yet seen that “prophecy” become fact, but I am stubbornly convinced that this is the direction Google has taken. If not, why can we already tie the “author” relation to a Google profile in posts, articles, videos, and books? In the future, I see AuthorRank becoming useful for other objects as well, such as photos, images, and audio.

Today, I want to focus on an aspect of AuthorRank which (incomprehensibly) does not receive much attention: the rel=”publisher” markup. It is rel=”publisher” that connects a site (the publisher) with its authors. Even when those same authors leave the publisher to start working with another site, their AuthorRank will continue to influence the “PublisherRank,” which makes that signal even stronger.

Relation between Publisher and Authors
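As a sketch of how these two relations were typically declared at the time (the Google+ IDs below are placeholders, not real profiles):

```html
<head>
  <!-- rel="publisher": links the whole site to its Google+ page (placeholder ID) -->
  <link rel="publisher" href="https://plus.google.com/112233445566778899001">
</head>
<body>
  <article>
    <!-- ... post content ... -->
    <!-- rel="author": links this piece of content to the author's Google+ profile (placeholder ID) -->
    <a rel="author" href="https://plus.google.com/998877665544332211009">About the author</a>
  </article>
</body>
```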

Google+

During the last SMX Social Media Marketing Expo, Vic Gundotra told Danny Sullivan:

“I think people are just now starting to understand that Google+ is Google. At some point, we’re gonna have a billion users. And eventually, you’ll just say: ‘Google has a billion users.’”

I am not going to discuss the value of Google+ as a social media channel, but we must admit that it is the irresistible evolution of Google for a number of reasons:

  • A social network is, by its very nature, a profiling tool.
  • The fact that rel=”author” and rel=”publisher” are strictly related to Google profiles makes them part of the future of Google Search (and also Paid Search).
  • It is the easiest way for Google to obtain real social signals, not just through how users act on Google+, but also through users connecting many other social accounts to their G+ profiles.

Google+ connected accounts

Accessible

“You don’t need to be at your desk to need an answer.”

You can find that phrase in the Ten Things We Know to Be True page of Google.com.

Google bought Android Inc. in 2005 and entered the mobile phone industry in 2008. Why the surge into mobile? Aside from Android being a gold mine, Google’s goal is to make information universally accessible. Because more and more people are using mobile for search (especially local search), it was imperative for Google to be a player in that space:

Mobile vs. Desktop local search

Search on mobile is almost synonymous with Local Search, which can also (partly) explain why Google Now has been developed, along with the peculiar design of the mobile version of Google Search.

Google Mobile Search with Local Search icons

Therefore, it’s time to stop thinking of mobile as an option and realize it’s a priority.

For that same reason (strongly connected to the concept of accessibility), Google started giving instructions and suggestions about how to build mobile-optimized sites, with a clear preference for responsive design.
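For those who have not looked at it yet, responsive design boils down to serving the same HTML at the same URL and adapting the layout with CSS. A minimal sketch (the 640px breakpoint is just an assumption):

```html
<!doctype html>
<html>
<head>
  <!-- Tell mobile browsers to use the device width instead of a zoomed-out desktop canvas -->
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <style>
    .content { width: 60%; margin: 0 auto; }
    /* Below an arbitrary 640px breakpoint, let the content span the full width */
    @media screen and (max-width: 640px) {
      .content { width: 100%; }
    }
  </style>
</head>
<body>
  <div class="content">Same HTML, same URL, one layout per screen size.</div>
</body>
</html>
```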

Mobile accessibility also means Google Voice Search, which draws its answers – what a surprise – from the Knowledge Graph, Schema, and Local Search.

In addition, we can’t forget Project Glass. It is still a Google X project, but it has already been given to developers so they can start designing software/apps in preparation for its commercial release, predicted for 2014.

Accessibility also means getting information to users quickly, which explains why site speed is so important to Google – so much so that it released mod_pagespeed this past October and updated it just a few days ago.

Lastly, WPO (Web Performance Optimization) is not exactly an SEO activity, but it affects SEOs, so it must be considered one of the priorities for 2013. The frequently troubled relationship between SEOs and developers/web designers will have to find a solution in 2013. We will need to become better communicators and reach them where they are.
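By way of illustration, even without touching the server, a few front-end habits already fall under WPO; the file names below are made up:

```html
<head>
  <!-- Stylesheets first, so the browser can start rendering as early as possible -->
  <link rel="stylesheet" href="/css/main.css">
</head>
<body>
  <!-- ... page content ... -->
  <!-- Non-critical scripts loaded without blocking rendering:
       "async" fetches in parallel and runs when ready, "defer" runs after parsing -->
  <script async src="/js/analytics.js"></script>
  <script defer src="/js/widgets.js"></script>
</body>
```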

Useful

At the end of November, Matt Cutts gave a definition of SEO as Search Experience Optimization in his Google Webmaster Help video series:

Nothing really new, but yet another confirmation that SEO should focus on providing users with useful information.

Panda, Penguin, EMD, Layout Update… all of these updates were aimed at prioritizing the most useful sites available, and punishing those that Google considered useless.

Content marketing (the discipline that helps create useful information) has become a big priority for SEOs. However, there are still many in our industry who don’t really understand what content marketing means or how SEO can be woven into a content strategy. This is not the best place to discuss the topic in depth, but check out the deck below for further explanation.

How to Build SEO into Content Strategy by Jonathon Colman

2013 will see the definitive adoption of content marketing into SEO, and those sites that do not integrate marketable content into their strategies will see themselves overtaken in the SERPs.

At the same time, we will also see an increase in content marketing spam: guest posts used as article marketing, infographic spam, or simply inconsistent content efforts. Sadly, we SEOs tend to ruin a lot of initially good tactics because of our short-sighted tactical vision. It’s possible that some of the Search Quality team’s actions will focus on these facets of spam, because they already have the tools to do it.

Usefulness to Google does not just mean “content,” so filling every kind of site with zombie infographics just because “they are cool” is not the way to go. Usefulness is paired with personalization, as if Google were saying, “We will offer you the opportunity to find the information that is interesting to you based on your previous searches, your interests, the authority of the sources, and where you are.”

For that reason, I consider the Venice update the most underestimated update of 2012. It completely changed the SERPs for almost every kind of query.

Moving forward, I recommend paying close attention to the Gmail Search Field experiment, and to why Google is putting so much effort into making International and Multilingual SEO easier.
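The rel=”alternate” hreflang annotations are one example of that effort: each language version of a page declares the others in its head. A minimal sketch with hypothetical URLs:

```html
<!-- Placed in the <head> of every language version; URLs are hypothetical -->
<link rel="alternate" hreflang="en" href="http://www.example.com/en/">
<link rel="alternate" hreflang="es" href="http://www.example.com/es/">
<link rel="alternate" hreflang="it" href="http://www.example.com/it/">
```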

Cassandra’s appendices: what updates might we see in 2013?

Between 2011 and 2012, we experienced three major updates: Panda, Penguin, and EMD.

The first update’s goal was to clean the SERPs of useless content, defined as content that doesn’t provide any real value to the users. The second aimed to clean the SERPs of content that ranked thanks to a “false popularity” obtained through low-quality link building actions, rather than ranking by value according to users. The third update’s goal was to clean the SERPs of content that ranked solely because of an exaggerated boost obtained from its exact match domain. 

If you really think about it, the Penguin and EMD updates were a logical consequence of Panda, and even more necessary after it. Panda resulted in a large number of sites disappearing from the SERPs. Other sites that survived Panda still won in the SERPs, mostly thanks to an over-SEO’d link building strategy. After Penguin, we saw those sites replaced by sites relying only on the strength of their domain names, which led to the EMD update rollout.

Are the SERPs perfect now? Not quite. Domain crowding (and its counterpart, domain shrinking), which had been a minor issue since 2007 and was somehow justified by the Brand Update, is becoming a problem, especially in countries where the EMD update has not yet rolled out.

MozCast Metrics Domain Diversity

Conclusion 

We know how much can still be done through Rich Snippets spam, the gaming of Local Search, and guest posting and infographic directories spam. In 2013, we may see the effects of the massive collection of spam sites Google is assembling (thanks, in part, to the disavow tool); could this be the “linkapocalypse,” or maybe even the real “Balrog” update? These are simply my assumptions, as they are every year when it comes to possible updates. What is certain is that we will see new releases of Panda and Penguin, and the extension of the EMD update to all regional Googles.

This is Google in 2013 for me. I am not a prophet, just someone who likes to look at the past while trying to interpret the future. Am I right? Probably not.

But, just maybe, I am Cassandra.
